13 research outputs found

    Somatodendritic consistency check for temporal feature segmentation

    The brain identifies potentially salient features within continuous information streams to process hierarchical temporal events. This requires the compression of information streams, for which effective computational principles are yet to be explored. Backpropagating action potentials can induce synaptic plasticity in the dendrites of cortical pyramidal neurons. By analogy with this effect, we model a self-supervising process that increases the similarity between dendritic and somatic activities, where the somatic activity is normalized by a running average. We further show that a family of networks composed of such two-compartment neurons performs a surprisingly wide variety of complex unsupervised learning tasks, including the chunking of temporal sequences and the source separation of mixed correlated signals. Common methods applicable to these temporal feature analyses were previously unknown. Our results suggest that neural networks with dendrites have a powerful ability to analyze temporal features. This simple neuron model may also prove useful in neural engineering applications.
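
    By way of illustration, here is a minimal numpy sketch of the core learning rule described above (an assumption-laden reading, not the authors' implementation): dendritic weights are nudged so the dendritic activity tracks the somatic activity after the soma is normalized by its running average. All names and constants (w, v, eta, tau) are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        n_in, T = 20, 5000
        eta, tau = 0.01, 0.99                  # learning rate, running-average decay

        w = rng.normal(scale=0.1, size=n_in)   # plastic dendritic weights
        v = rng.normal(size=n_in)              # fixed somatic drive (stand-in)
        s_avg = 1.0                            # running average of |somatic activity|

        for t in range(T):
            x = rng.normal(size=n_in)          # input stream (stand-in for real data)
            d = w @ x                          # dendritic activity
            s = np.tanh(v @ x)                 # somatic activity
            s_avg = tau * s_avg + (1 - tau) * abs(s)
            s_norm = s / (s_avg + 1e-8)        # soma normalized by its running average
            w += eta * (s_norm - d) * x        # pull dendrite toward normalized soma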

    Learning Task-Related Activities From Independent Local-Field-Potential Components Across Motor Cortex Layers

    Motor cortical microcircuits receive inputs from dispersed cortical and subcortical regions in behaving animals. However, how these inputs contribute to the learning and execution of voluntary sequential motor behaviors remains elusive. Here, we analyzed the independent components extracted from local field potential (LFP) activity recorded at multiple depths of rat motor cortex during reward-motivated movement to study their roles in motor learning. Because slow gamma (30–50 Hz), fast gamma (60–120 Hz), and theta (4–10 Hz) oscillations temporally coordinate task-relevant motor cortical activities, we first explored the behavioral state- and layer-dependent coordination of motor behavior in these frequency ranges. Consistent with previous findings, oscillations in the slow and fast gamma bands dominated during distinct movement states, i.e., the preparation and execution states, respectively. However, we also identified a novel independent component that appeared predominantly in deep cortical layers and exhibited enhanced slow gamma activity during the execution state. We then used the four major independent components to train a recurrent network model on the same lever movements that the rats performed. We show that the independent components contribute differently to the formation of various task-related activities but also play overlapping roles in motor learning.
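
    As a loose sketch of the first analysis step, independent components can be extracted from multi-depth LFP with off-the-shelf ICA. The sketch below substitutes synthetic band-limited signals for the rats' recordings, and all parameter values are illustrative.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        fs, n_depths, n_samples = 1000, 8, 10_000   # stand-in for multi-depth LFP
        t = np.arange(n_samples) / fs
        sources = np.vstack([
            np.sin(2 * np.pi * 7 * t),              # theta-like (4–10 Hz)
            np.sin(2 * np.pi * 40 * t),             # slow-gamma-like (30–50 Hz)
            np.sin(2 * np.pi * 80 * t),             # fast-gamma-like (60–120 Hz)
            rng.standard_normal(n_samples),         # broadband noise
        ])
        mixing = rng.normal(size=(n_depths, sources.shape[0]))
        lfp = mixing @ sources + 0.1 * rng.standard_normal((n_depths, n_samples))

        ica = FastICA(n_components=4, random_state=0)
        components = ica.fit_transform(lfp.T)       # (n_samples, 4) time courses
        depth_profiles = ica.mixing_                # component loading across depths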

    Neural circuit mechanisms of hierarchical sequence learning tested on large-scale recording data

    The brain performs various cognitive functions by learning the spatiotemporal salient features of the environment. This learning requires unsupervised segmentation of hierarchically organized spike sequences, but the underlying neural mechanism remains poorly understood. Here, we show that a recurrent gated network of neurons with dendrites can efficiently solve difficult segmentation tasks. In this model, multiplicative recurrent connections learn a context-dependent gating of dendro-somatic information transfer to minimize the error in the dendrites' prediction of somatic responses. Consequently, these connections filter out input features that are represented by the dendrites but redundant in the given context. The model was tested on both synthetic and real neural data. In particular, it successfully segmented multiple cell assemblies repeating in large-scale calcium imaging data containing thousands of cortical neurons. Our results suggest that recurrent gating of dendro-somatic signal transfer is crucial for cortical learning of context-dependent segmentation tasks.
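
    A toy sketch of the gating idea (hypothetical names and update rule, not the published model): a sigmoidal gate computed from the recurrent context multiplies the dendro-somatic transfer, and the gating weights descend the dendrite's somatic prediction error.

        import numpy as np

        rng = np.random.default_rng(0)
        n, eta = 10, 0.05

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        W_rec = rng.normal(scale=0.1, size=(n, n))  # multiplicative recurrent gates
        r = np.zeros(n)                             # recurrent context (last rates)

        for t in range(2000):
            d = rng.normal(size=n)       # dendritic activities (stand-in inputs)
            g = sigmoid(W_rec @ r)       # context-dependent gate per neuron
            s = np.tanh(g * d)           # soma driven by the gated transfer
            err = s - g * d              # error of the dendritic prediction
            # gradient step on 0.5*err**2 with respect to the gating weights
            W_rec += eta * np.outer(err * d * g * (1 - g), r)
            r = s                        # context for the next time step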

    Continual General Chunking Problem and SyncMap

    Humans possess an inherent ability to chunk sequences into their constituent parts. In fact, this ability is thought to bootstrap language skills and the learning of image patterns, which might be a key to a more animal-like type of intelligence. Here, we propose a continual generalization of the chunking problem (an unsupervised problem), encompassing fixed and probabilistic chunks, the discovery of temporal and causal structures, and their continual variations. Additionally, we propose an algorithm called SyncMap that can learn and adapt to changes in the problem by creating a dynamic map that preserves the correlations between variables. Our results suggest that SyncMap learns near-optimal solutions despite the presence of many types of structures and their continual variation. When compared with Word2vec, PARSER, and MRIL, SyncMap surpasses or ties with the best algorithm in 66% of the scenarios while being the second best in the remaining 34%. SyncMap's simple, model-free dynamics and its absence of loss functions reveal that, perhaps surprisingly, much can be done with self-organization alone.
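
    A loose sketch of self-organizing dynamics in the spirit of SyncMap (heavily simplified, not the published algorithm): each variable gets a point in a low-dimensional map, co-activated variables attract each other and are repelled from the rest, and clustering the final map reads out the chunks. The toy input stream and all constants are hypothetical.

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(0)
        n_vars, lr = 6, 0.1
        coords = rng.uniform(-1, 1, size=(n_vars, 2))   # one map point per variable

        for _ in range(3000):
            chunk = rng.integers(2)          # toy stream: chunks {0,1,2} and {3,4,5}
            active = np.zeros(n_vars, bool)
            active[3 * chunk: 3 * chunk + 3] = True

            pos_c = coords[active].mean(axis=0)     # centroid of co-active nodes
            neg_c = coords[~active].mean(axis=0)    # centroid of the rest
            coords[active] += lr * (pos_c - coords[active])         # attraction
            coords[active] -= 0.1 * lr * (neg_c - coords[active])   # repulsion
            coords /= np.abs(coords).max() + 1e-8   # keep the map bounded

        print(DBSCAN(eps=0.3, min_samples=2).fit_predict(coords))
        # nodes sharing a cluster label were learned as one chunk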

    Intrinsic bursts facilitate learning of Lévy flight movements in recurrent neural network models

    Isolated spikes and bursts of spikes are thought to provide the two major modes of information coding by neurons. Bursts are known to be crucial for fundamental processes between neuron pairs, such as neuronal communication and synaptic plasticity. Neuronal bursting also has implications in neurodegenerative diseases and mental disorders. Despite these findings on the roles of bursts, whether and how bursts offer an advantage over isolated spikes in network-level computation remains elusive. Here, we demonstrate in a computational model that intrinsic bursts, but not isolated spikes, can greatly facilitate learning of Lévy flight random walk trajectories by synchronizing burst onsets across a neural population. Lévy flights are a hallmark of optimal search strategies and appear in cognitive behaviors such as saccadic eye movements and memory retrieval. Our results suggest that bursting is crucial for sequence learning in recurrent neural networks when sequences comprise discrete jumps with long-tailed length distributions.
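
    For context, a Lévy flight is a random walk whose jump lengths follow a long-tailed power law. A minimal sketch of generating such a target trajectory (the tail exponent and step construction are illustrative, not taken from the paper):

        import numpy as np

        rng = np.random.default_rng(0)
        n_steps, alpha = 1000, 1.5                   # tail exponent, 1 < alpha < 3
        lengths = rng.pareto(alpha, n_steps) + 1.0   # long-tailed jump sizes
        angles = rng.uniform(0, 2 * np.pi, n_steps)  # isotropic jump directions
        steps = lengths[:, None] * np.column_stack([np.cos(angles), np.sin(angles)])
        trajectory = np.cumsum(steps, axis=0)        # 2-D Lévy flight positions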

    Modeling the Repetition-Based Recovering of Acoustic and Visual Sources With Dendritic Neurons

    In natural auditory environments, acoustic signals originate from the temporal superimposition of different sound sources. The problem of inferring individual sources from ambiguous mixtures of sounds is known as blind source separation. Experiments on humans have demonstrated that the auditory system can identify sound sources as repeating patterns embedded in the acoustic input. Source repetition produces temporal regularities that can be detected and used for segregation. Specifically, listeners can identify sounds occurring more than once across different mixtures, but not sounds heard only in a single mixture. However, whether such behavior can be computationally modeled had not yet been explored. Here, we propose a biologically inspired computational model that performs blind source separation on sequences of mixtures of acoustic stimuli. Our method relies on a somatodendritic neuron model trained with a Hebbian-like learning rule that was originally conceived to detect spatio-temporal patterns recurring in synaptic inputs. We show that the segregation capabilities of our model are reminiscent of human performance in a variety of experimental settings involving synthesized sounds with naturalistic properties. Furthermore, we extend the study to task settings not yet explored with human subjects, namely natural sounds and images. Overall, our work suggests that somatodendritic neuron models offer a promising neuro-inspired learning strategy that accounts for the characteristics of the brain's segregation capabilities and makes predictions about yet-untested experimental settings.
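
    The repetition principle itself is easy to demonstrate: if a target source recurs across mixtures while the distractors do not, the non-repeating parts cancel out. The toy below illustrates this with simple averaging, purely to show why repetition enables segregation; the paper's somatodendritic model is not an averaging scheme.

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples, n_mixtures = 2000, 10
        target = rng.standard_normal(n_samples)      # source present in every mixture
        mixtures = np.array([
            target + rng.standard_normal(n_samples)  # fresh distractor each time
            for _ in range(n_mixtures)
        ])

        estimate = mixtures.mean(axis=0)             # distractors average out
        print(np.corrcoef(estimate, target)[0, 1])   # ~0.95 with 10 mixtures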